How to Download Files Using wget in Linux


wget is a Linux command-line utility used to download files and web pages from the internet. It supports several internet protocols, including HTTP, HTTPS, and FTP, to access and retrieve files, and you can use different options with the wget command to perform a variety of tasks. The wget utility is especially useful on slow or unstable internet connections: in the event of a download failure, wget keeps retrying until the entire file has been retrieved.
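
The retry behavior can also be tuned explicitly. The sketch below uses a placeholder URL and shows the options that tend to matter most on an unreliable link: -c resumes a partial file, --tries sets the number of attempts, --waitretry sets the maximum pause between attempts, and --retry-connrefused treats a refused connection as a temporary error.

# Placeholder URL; resume a partial download and retry up to 10 times
wget -c --tries=10 --waitretry=30 --retry-connrefused https://example.com/large-file.iso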

Features

Downloads multiple files at once.
Limits download speed and bandwidth.
Resumes failed downloads.
Downloads complete websites and FTP sites.
Supports SSL/TLS for encrypted downloads.
Downloads files through proxies (see the example after this list).
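
For example, downloading through a proxy usually needs no special flags, because wget honors the standard proxy environment variables. The proxy address below is a placeholder used for illustration only.

# Hypothetical proxy address; wget picks up the https_proxy/http_proxy variables automatically
export https_proxy=http://proxy.example.com:3128
wget https://curl.se/download/curl-7.82.0.tar.gz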

In this post, we will show you how to use the wget command to download files in Linux.

Prerequisites

A server running Linux on the Atlantic.Net Cloud Platform
A root password configured on your server

Create an Atlantic.Net Cloud Server

First, log in to your Atlantic.Net Cloud Server. Create a new server, choosing any Linux operating system with at least 1GB RAM. Connect to your Cloud Server via SSH and log in using the credentials highlighted at the top of the page.
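
If you are connecting from a Linux or macOS terminal, the SSH login is a single command; the IP address below is a placeholder for the one shown in your cloud dashboard.

# Replace 192.0.2.10 with your server's public IP address
ssh root@192.0.2.10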

Install wget in Linux

The wget package comes pre-installed on most major Linux distributions. If it is not installed, you can follow the steps below to install it.

For Debian and Ubuntu operating systems, install wget using the following command:

apt-get install wget -y

For RHEL, CentOS, Rocky Linux, and Fedora operating systems, install wget using the following command:

dnf install wget -y
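
Either way, you can confirm that wget is installed and check which version you have:

wget --version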


Download a File with wget Command

The basic syntax to download a file using the wget command is shown below:

wget [option] [URL]

To download a single file, run the following command:

wget https://repo.skype.com/latest/skypeforlinux-64.deb

This will download a file named skypeforlinux-64.deb to your current working directory:

Connecting to get.skype.com (get.skype.com)|52.174.193.75|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://repo.skype.com/latest/skypeforlinux-64.deb [following]
--2022-03-16 08:45:16-- https://repo.skype.com/latest/skypeforlinux-64.deb
Resolving repo.skype.com (repo.skype.com)... 23.50.252.171, 2405:200:1630:18af::1263, 2405:200:1630:189c::1263
Connecting to repo.skype.com (repo.skype.com)|23.50.252.171|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 140790360 (134M) [application/x-debian-package]
Saving to: ‘skypeforlinux-64.deb’

skypeforlinux-64.deb 100%[=================================================================>] 134.27M 2.10MB/s in 64s

2022-03-16 08:46:20 (2.09 MB/s) - ‘skypeforlinux-64.deb’ saved [140790360/140790360]

Download Multiple Files Using wget Command

You can also download multiple files at once by specifying multiple URLs with the wget command.

wget http://ftp.gnu.org/gnu/wget/wget-1.11.4.tar.gz https://curl.se/download/curl-7.82.0.tar.gz

This will download multiple files in your current working directory:

HTTP request sent, awaiting response... 200 OK
Length: 1475149 (1.4M) [application/x-gzip]
Saving to: ‘wget-1.11.4.tar.gz’

wget-1.11.4.tar.gz 100%[=================================================================>] 1.41M 293KB/s in 4.9s

2022-03-16 08:51:01 (293 KB/s) - ‘wget-1.11.4.tar.gz’ saved [1475149/1475149]

--2022-03-16 08:51:01-- https://curl.se/download/curl-7.82.0.tar.gz
Resolving curl.se (curl.se)... 151.101.2.49, 151.101.66.49, 151.101.130.49, ...
Connecting to curl.se (curl.se)|151.101.2.49|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4106857 (3.9M) [application/x-gzip]
Saving to: ‘curl-7.82.0.tar.gz’

curl-7.82.0.tar.gz 100%[=================================================================>] 3.92M 3.26MB/s in 1.2s

2022-03-16 08:51:04 (3.26 MB/s) - ‘curl-7.82.0.tar.gz’ saved [4106857/4106857]

FINISHED --2022-03-16 08:51:04--
Total wall clock time: 15s
Downloaded: 2 files, 5.3M in 6.1s (891 KB/s)

Download the File with a Different Name

Generally, wget saves a downloaded file under its original name. You can use the -O option to save the file under a different name:

wget -O wget.tar.gz http://ftp.gnu.org/gnu/wget/wget-1.11.4.tar.gz

This will download the file and save it with a different name:

Resolving ftp.gnu.org (ftp.gnu.org)... 209.51.188.20, 2001:470:142:3::b
Connecting to ftp.gnu.org (ftp.gnu.org)|209.51.188.20|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1475149 (1.4M) [application/x-gzip]
Saving to: ‘wget.tar.gz’

wget.tar.gz 100%[=================================================================>] 1.41M 231KB/s in 6.2s

2022-03-16 08:55:27 (231 KB/s) - ‘wget.tar.gz’ saved [1475149/1475149]

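As a related sketch, passing - as the file name to -O streams the download to standard output instead of saving it, which is handy for piping an archive straight into another tool (the -q flag suppresses wget's progress output):

# Download the wget source tarball and extract it in one step, without keeping the archive
wget -qO- http://ftp.gnu.org/gnu/wget/wget-1.11.4.tar.gz | tar xzf -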

Download a File to a Specified Location

By default, the wget command saves files in your current working directory. You can use the -P option to download a file to a specified directory:

wget -P /tmp/ http://ftp.gnu.org/gnu/wget/wget-1.11.4.tar.gz

Output:

Connecting to ftp.gnu.org (ftp.gnu.org)|209.51.188.20|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1475149 (1.4M) [application/x-gzip]
Saving to: ‘/tmp/wget-1.11.4.tar.gz.1’

wget-1.11.4.tar.gz.1 100%[=================================================================>] 1.41M 227KB/s in 6.3s

2022-03-16 08:58:07 (227 KB/s) - ‘/tmp/wget-1.11.4.tar.gz.1’ saved [1475149/1475149]

Download Multiple Files From a File

You can use the -i option to specify a file containing a list of URLs to be downloaded. First, create a file named download-list.txt:

nano download-list.txt

Add the URLs of the files you want to download, one per line:

http://ftp.gnu.org/gnu/wget/wget-1.11.4.tar.gz
http://www.openssl.org/source/openssl-0.9.8h.tar.gz
https://curl.se/download/curl-7.82.0.tar.gz

Save and close the file, then run the wget command, specifying the download-list.txt file as shown below:

wget -i download-list.txt

This will download all files one by one as shown below:

Length: 1475149 (1.4M) [application/x-gzip]
Saving to: ‘wget-1.11.4.tar.gz’

wget-1.11.4.tar.gz 100%[=================================================================>] 1.41M 142KB/s in 10s

2022-03-16 09:06:19 (143 KB/s) - ‘wget-1.11.4.tar.gz’ saved [1475149/1475149]

Length: 3439981 (3.3M) [application/x-gzip]
Saving to: ‘openssl-0.9.8h.tar.gz’

openssl-0.9.8h.tar.gz 100%[=================================================================>] 3.28M 4.44MB/s in 0.7s

2022-03-16 09:06:21 (4.44 MB/s) - ‘openssl-0.9.8h.tar.gz’ saved [3439981/3439981]

Length: 4106857 (3.9M) [application/x-gzip]
Saving to: ‘curl-7.82.0.tar.gz’

curl-7.82.0.tar.gz 100%[=================================================================>] 3.92M 4.50MB/s in 0.9s

2022-03-16 09:06:22 (4.50 MB/s) - ‘curl-7.82.0.tar.gz’ saved [4106857/4106857]

FINISHED --2022-03-16 09:06:22--
Total wall clock time: 13s
Downloaded: 3 files, 8.6M in 12s (755 KB/s)

Resume an Incomplete Download Using wget Command

If you are downloading a large file and the transfer stops due to a slow or unstable internet connection, you can use the -c option to resume the download where it left off:

wget -c https://curl.se/download/curl-7.82.0.tar.gz

Output:

--2022-03-16 09:11:11-- https://curl.se/download/curl-7.82.0.tar.gz
Resolving curl.se (curl.se)... 151.101.2.49, 151.101.66.49, 151.101.130.49, ...
Connecting to curl.se (curl.se)|151.101.2.49|:443... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 4106857 (3.9M), 753031 (735K) remaining [application/x-gzip]
Saving to: ‘curl-7.82.0.tar.gz’

curl-7.82.0.tar.gz 100%[+++++++++++++++++++++++++++++++++++++++++++++++++++++============>] 3.92M 4.23MB/s in 0.2s

2022-03-16 09:11:11 (4.23 MB/s) - ‘curl-7.82.0.tar.gz’ saved [4106857/4106857]

Ignore SSL Certificate Check While Downloading

You can use the --no-check-certificate option to ignore SSL certificate checks while downloading files over HTTPS.

wget --no-check-certificate https://curl.se/download/curl-7.82.0.tar.gz

Output:

--2022-03-16 09:14:07-- https://curl.se/download/curl-7.82.0.tar.gz
Resolving curl.se (curl.se)... 151.101.2.49, 151.101.66.49, 151.101.130.49, ...
Connecting to curl.se (curl.se)|151.101.2.49|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4106857 (3.9M) [application/x-gzip]
Saving to: ‘curl-7.82.0.tar.gz’

curl-7.82.0.tar.gz 100%[=================================================================>] 3.92M 4.40MB/s in 0.9s

2022-03-16 09:14:08 (4.40 MB/s) - ‘curl-7.82.0.tar.gz’ saved [4106857/4106857]

Set Download Speed Limit

The wget command also allows you to limit the download speed so that a transfer does not saturate a slow internet connection. You can use the --limit-rate option to set the speed limit.

For example, to limit the download speed to 10 KB/s, run the following command:

wget -c --limit-rate=10k https://curl.se/download/curl-7.82.0.tar.gz

You should see the following output:

--2022-03-16 09:17:36-- https://curl.se/download/curl-7.82.0.tar.gz
Resolving curl.se (curl.se)... 151.101.2.49, 151.101.66.49, 151.101.130.49, ...
Connecting to curl.se (curl.se)|151.101.2.49|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4106857 (3.9M) [application/x-gzip]
Saving to: ‘curl-7.82.0.tar.gz’

curl-7.82.0.tar.gz 3%[=> ] 159.22K 10.0KB/s eta 6m 26s

Download a File From a Password-Protected FTP Server

The wget command also allows you to specify a username and password if you want to download a file from a password-protected FTP server.

You can use the following syntax to download a file from a password-protected FTP server:

wget --ftp-user=username --ftp-password=password ftp://ftp.server.com/filename.tar.gz

Create a Mirror of an Entire Website

You can use the -m option to create a complete local copy of a website by downloading all of its files, including CSS, JavaScript, and images:

wget -m https://linuxbuz.com

If you want to browse the downloaded website locally, add the -k option to convert links for local viewing and the -p option to download all the files needed to display each page:

wget -m -k -p https://linuxbuz.com

Conclusion

In this post, we explained how to use the wget command with different options to download files from remote URLs. You can now use the wget command according to your needs. Try it on dedicated servers from Atlantic.Net!
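
As a closing sketch, several of the options covered above combine naturally in a single command; the directory, rate, and list file below are illustrative:

# Resume interrupted transfers, save to /tmp, cap the rate at 500 KB/s, and read URLs from a list
wget -c -P /tmp --limit-rate=500k -i download-list.txt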


